Sparse Coding Neural Gas: Learning of overcomplete data representations

Authors

  • Kai Labusch
  • Erhardt Barth
  • Thomas Martinetz
Abstract

We consider the problem of learning an unknown (overcomplete) basis from data that are generated from unknown and sparse linear combinations. Introducing the Sparse Coding Neural Gas algorithm, we show how to employ a combination of the original Neural Gas algorithm and Oja’s rule in order to learn a simple sparse code that represents each training sample by only one scaled basis vector. We generalize this algorithm by using Orthogonal Matching Pursuit in order to learn a sparse code where each training sample is represented by a linear combination of up to k basis elements. We evaluate the influence of additive noise and the coherence of the original basis on the performance with respect to the reconstruction of the original basis, and compare the new method to other state-of-the-art methods. For this analysis, we use artificial data where the original basis is known. Furthermore, we employ our method to learn an overcomplete representation for natural images and obtain an appealing set of basis functions that resemble the receptive fields of neurons in the primary visual cortex. An important result is that the algorithm converges even with a high degree of overcompleteness. A reference implementation of the methods is provided.¹
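The combination described in the abstract — a soft-competitive Neural Gas ranking of dictionary atoms coupled with an Oja-style update — can be sketched for the simplest case (each sample coded by one scaled basis vector) roughly as below. The function name, annealing schedules, and parameter defaults are illustrative assumptions, not the authors' reference implementation:

```python
import numpy as np

def scng(X, num_atoms, epochs=20, eps0=0.5, eps_final=0.01,
         lam0=None, lam_final=0.01, rng=None):
    """Rough Sparse Coding Neural Gas sketch (k = 1 case).

    X: (n_samples, dim) data, assumed zero-mean.
    Returns a (num_atoms, dim) dictionary with unit-norm rows.
    """
    rng = np.random.default_rng(rng)
    n, dim = X.shape
    lam0 = lam0 if lam0 is not None else num_atoms / 2.0
    # Random unit-norm initial dictionary.
    W = rng.standard_normal((num_atoms, dim))
    W /= np.linalg.norm(W, axis=1, keepdims=True)

    t, t_max = 0, epochs * n
    for _ in range(epochs):
        for i in rng.permutation(n):
            x = X[i]
            # Exponentially annealed learning rate and neighbourhood size.
            frac = t / t_max
            eps = eps0 * (eps_final / eps0) ** frac
            lam = lam0 * (lam_final / lam0) ** frac
            # Soft-competitive step: rank atoms by |projection| onto x
            # (rank 0 = best-matching atom, as in Neural Gas).
            proj = W @ x
            ranks = np.argsort(np.argsort(-np.abs(proj)))
            # Oja-like update, weighted by exp(-rank / lambda), then renormalize.
            y = proj[:, None]
            W += eps * np.exp(-ranks / lam)[:, None] * y * (x[None, :] - y * W)
            W /= np.linalg.norm(W, axis=1, keepdims=True)
            t += 1
    return W
```

The rank-weighted update means every atom moves a little toward explaining each sample, with the best-matching atom moving most; the Oja term keeps the learned atoms at unit norm in expectation, and the explicit renormalization enforces it exactly.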

Related articles

Learning Data Representations with Sparse Coding Neural Gas

We consider the problem of learning an unknown (overcomplete) basis from an unknown sparse linear combination. Introducing the “sparse coding neural gas” algorithm, we show how to employ a combination of the original neural gas algorithm and Oja’s rule in order to learn a simple sparse code that represents each training sample by a multiple of one basis vector. We generalise this algorithm usin...


Soft-competitive learning of sparse codes and its application to image reconstruction

We propose a new algorithm for the design of overcomplete dictionaries for sparse coding, Neural Gas for Dictionary Learning (NGDL), which uses a set of solutions for the sparse coefficients in each update step of the dictionary. In order to obtain such a set of solutions, we additionally propose the bag of pursuits method (BOP) for sparse approximation. Using BOP in order to determine the coef...


Learning sparse codes for image reconstruction

We propose a new algorithm for the design of overcomplete dictionaries for sparse coding that generalizes the Sparse Coding Neural Gas (SCNG) algorithm such that it is not bound to a particular approximation method for the coefficients of the dictionary elements. In an application to image reconstruction, a dictionary that has been learned using this algorithm outperforms a dictionary that has ...


A Mixture Model for Learning Sparse Representations

In a latent variable model, an overcomplete representation is one in which the number of latent variables is at least as large as the dimension of the data observations. Overcomplete representations have been advocated due to robustness in the presence of noise, the ability to be sparse, and an inherent flexibility in modeling the structure of data [9]. In this report, we modify factor analysis...


Demixing Jazz-Music: Sparse Coding Neural Gas for the Separation of Noisy Overcomplete Sources

We consider the problem of separating noisy overcomplete sources from linear mixtures, i.e., we observe N mixtures of M > N sparse sources. We show that the “Sparse Coding Neural Gas” (SCNG) algorithm [8, 9] can be employed in order to estimate the mixing matrix. Based on the learned mixing matrix the sources are obtained by orthogonal matching pursuit. Using synthetically generated data, we ev...



Journal:
  • Neurocomputing

Volume 72, Issue –

Pages –

Year of publication: 2009